Natural language grammatical inference with recurrent neural networks

Authors
Abstract


Similar resources

Natural Language Grammatical Inference with Recurrent Neural Networks

This paper examines the inductive inference of a complex grammar with neural networks – specifically, the task considered is that of training a network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government-and-Binding theory. Neural networks ar...
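As a rough illustration of the task described above, the sketch below defines a minimal recurrent classifier that maps a padded sequence of word ids to a single grammatical/ungrammatical logit. The LSTM encoder, layer sizes, and names are assumptions made for this sketch, not the configuration reported in the paper.

```python
# Illustrative sketch only: a minimal recurrent classifier for the kind of task the
# paper describes (labelling token sequences as grammatical vs. ungrammatical).
# The LSTM, sizes, and names are assumptions, not the authors' setup.
import torch
import torch.nn as nn


class GrammaticalityClassifier(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)  # single logit: grammatical vs. not

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer word ids; 0 is padding
        embedded = self.embed(tokens)
        _, (h_n, _) = self.rnn(embedded)      # final hidden state summarises the sentence
        return self.out(h_n[-1]).squeeze(-1)  # (batch,) raw logits


# Toy usage: two padded "sentences" of word ids with binary labels.
model = GrammaticalityClassifier(vocab_size=100)
tokens = torch.tensor([[5, 17, 8, 0], [9, 9, 2, 3]])
labels = torch.tensor([1.0, 0.0])
loss = nn.BCEWithLogitsLoss()(model(tokens), labels)
loss.backward()
```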


Discrete recurrent neural networks for grammatical inference

Describes a novel neural architecture for learning deterministic context-free grammars, or equivalently, deterministic pushdown automata. The unique feature of the proposed network is that it forms stable state representations during learning; previous work has shown that conventional analog recurrent networks can be inherently unstable in that they cannot retain their state memory for long inpu...
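One way to picture the "stable state representations" mentioned above is to force the hidden state onto a small discrete set after every time step. The sketch below does this with a plain RNN cell, rounding each unit to {-1, 0, +1} and passing gradients through with a straight-through estimator; the quantisation scheme and cell type are assumptions for illustration, not the architecture proposed in the paper.

```python
# Illustrative sketch only: re-discretise the hidden state at every step so that
# state memory cannot drift. The rounding scheme, cell type, and straight-through
# gradient trick are assumptions, not the paper's architecture.
import torch
import torch.nn as nn


class QuantizedRNNCell(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.cell = nn.RNNCell(input_dim, hidden_dim, nonlinearity="tanh")

    @staticmethod
    def quantize(h: torch.Tensor) -> torch.Tensor:
        # Snap each unit to {-1, 0, +1}; the straight-through trick keeps gradients flowing.
        h_discrete = torch.round(h)
        return h + (h_discrete - h).detach()

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: (seq_len, batch, input_dim)
        h = torch.zeros(x_seq.size(1), self.cell.hidden_size)
        for x_t in x_seq:
            h = self.quantize(self.cell(x_t, h))  # state is re-discretised at every step
        return h  # final (discrete) state, e.g. for an automaton-style readout


cell = QuantizedRNNCell(input_dim=4, hidden_dim=8)
final_state = cell(torch.randn(10, 2, 4))  # 10 time steps, batch of 2
```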


Natural language grammatical inference: a comparison of recurrent neural networks and machine learning methods

We consider the task of training a neural network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government and Binding theory. We investigate the following models: feed-forward neural networks, Frasconi-Gori-Soda and Back-Tsoi locally recurrent ne...
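For a loose illustration of the model families listed above, the sketch below implements a "locally recurrent" layer in which each hidden unit feeds back only its own previous activation through a single scalar weight, rather than through a full hidden-to-hidden matrix. This is an assumed simplification for illustration, not a faithful reimplementation of the Frasconi-Gori-Soda or Back-Tsoi architectures compared in the paper.

```python
# Illustrative sketch only: "locally recurrent" is read here as per-unit self-feedback
# (a diagonal feedback weight), in contrast with a plain feed-forward layer or a fully
# recurrent one. Not a reimplementation of the architectures compared in the paper.
import torch
import torch.nn as nn


class LocallyRecurrentLayer(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.input_map = nn.Linear(input_dim, hidden_dim)
        # One trainable self-feedback weight per unit instead of a full matrix.
        self.self_feedback = nn.Parameter(torch.zeros(hidden_dim))

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: (seq_len, batch, input_dim); returns the final hidden activation
        h = torch.zeros(x_seq.size(1), self.self_feedback.numel())
        for x_t in x_seq:
            h = torch.tanh(self.input_map(x_t) + self.self_feedback * h)
        return h


layer = LocallyRecurrentLayer(input_dim=4, hidden_dim=8)
h_final = layer(torch.randn(10, 2, 4))  # 10 time steps, batch of 2
```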


Neural Networks for Natural Language Inference

Predicting whether a sentence entails another sentence, contradicts another sentence, or is in a neutral entailment relation with another sentence is both an important NLP task and a sophisticated way of testing semantic sentence encoding models. In this project, I evaluate three sentence encoding models on the Stanford Natural Language Inference (SNLI) corpus. In particular, I investiga...
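A generic sentence-encoding NLI model of the kind described above can be sketched as follows: encode premise and hypothesis with a shared encoder, combine the two vectors, and classify into entailment, contradiction, or neutral. The GRU encoder and the [u, v, |u - v|, u * v] feature combination are common choices assumed here, not necessarily the three models investigated in the project.

```python
# Illustrative sketch only: a generic sentence-encoder NLI classifier. The encoder
# and feature combination are assumptions, not the project's evaluated models.
import torch
import torch.nn as nn


class SentenceEncoderNLI(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Compare the two encodings via [u, v, |u - v|, u * v].
        self.classifier = nn.Linear(4 * hidden_dim, 3)  # entailment / contradiction / neutral

    def encode(self, tokens: torch.Tensor) -> torch.Tensor:
        _, h_n = self.encoder(self.embed(tokens))
        return h_n[-1]  # (batch, hidden_dim)

    def forward(self, premise: torch.Tensor, hypothesis: torch.Tensor) -> torch.Tensor:
        u, v = self.encode(premise), self.encode(hypothesis)
        features = torch.cat([u, v, torch.abs(u - v), u * v], dim=-1)
        return self.classifier(features)  # (batch, 3) class logits


model = SentenceEncoderNLI(vocab_size=1000)
logits = model(torch.tensor([[4, 8, 15, 0]]), torch.tensor([[16, 23, 42, 0]]))
```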


Natural Language Recursion and Recurrent Neural Networks

The recursive structure of natural language was one of the principal, and most telling, sources of difficulty for associationist models of linguistic behaviour. It has, more recently, become a focus in the debate surrounding the generality of neural network models of language, which many would regard as the natural heirs of the associationist legacy. Can neural networks learn to handle recursive ...
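One common probe for the recursive structure discussed above is a centre-embedded pattern such as a^n b^n, whose well-formedness requires tracking unbounded nesting. The tiny generator below is only an assumed illustration of such a probe; the paper defines its own tests in the full text.

```python
# Illustrative sketch only: generate centre-embedded a^n b^n strings as a simple
# probe of recursive structure. The probe is an assumption for illustration; the
# paper's own experiments are described in the full text.
def center_embedded(depth: int) -> str:
    """Return a string with `depth` levels of centre embedding, e.g. 'a a a b b b'."""
    return " ".join(["a"] * depth + ["b"] * depth)


# Well-formed strings (label 1) could be paired with corrupted ones (label 0)
# to test whether a recurrent network generalises to deeper embeddings.
examples = [(center_embedded(n), 1) for n in range(1, 6)]
```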



Journal

Journal title: IEEE Transactions on Knowledge and Data Engineering

Year: 2000

ISSN: 1041-4347

DOI: 10.1109/69.842255